test(examples): Add some examples leveraging pydantic-AI and other chatlas alternatives #66
Conversation
Resolved review threads (now outdated):
- pkg-py/tests/playwright/chat/langchain/structured_output/app.py
- pkg-py/tests/playwright/chat/llama-index/rag_with_chatlas/rag_example.py
- pkg-py/tests/playwright/chat/llama-index/structured_output/app.py (4 threads)
- pkg-py/tests/playwright/chat/pydantic-ai/data-sci-adventure/app.py
- pkg-py/tests/playwright/chat/pydantic-ai/structured_output/app.py
```python
# An async generator function to stream the response from the Pydantic AI agent
async def pydantic_stream_generator(user_input: str):
    async with chat_client.run_stream(user_input) as result:
        async for chunk in result.stream_text(delta=True):
            yield chunk
```
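To show the streaming shape in isolation, here is a minimal self-contained sketch of the same async delta-streaming pattern. `FakeAgent` and `FakeStreamResult` are hypothetical stand-ins for pydantic-ai's `Agent.run_stream()` context manager and its result object, so this runs without an API key; only the generator at the bottom mirrors the example above.

```python
import asyncio
from contextlib import asynccontextmanager


# Hypothetical stand-in for the result object returned by run_stream().
class FakeStreamResult:
    def __init__(self, text: str):
        self._text = text

    async def stream_text(self, delta: bool = True):
        # Emit the response in small chunks, like stream_text(delta=True).
        for i in range(0, len(self._text), 4):
            yield self._text[i : i + 4]


# Hypothetical stand-in for pydantic-ai's Agent.
class FakeAgent:
    @asynccontextmanager
    async def run_stream(self, user_input: str):
        yield FakeStreamResult(f"echo: {user_input}")


chat_client = FakeAgent()


# Same shape as the generator in the example above.
async def pydantic_stream_generator(user_input: str):
    async with chat_client.run_stream(user_input) as result:
        async for chunk in result.stream_text(delta=True):
            yield chunk


async def main():
    chunks = [c async for c in pydantic_stream_generator("hi")]
    return "".join(chunks)


print(asyncio.run(main()))  # → echo: hi
```

The real `Agent.run_stream()` needs a model and credentials, but the consuming code (the `async with` / `async for` nesting) is identical.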
I just realized this won't retain message history. I think the simplest way to do that now is supposed to be something like:

```python
from pydantic_ai import Agent
from pydantic_ai.messages import ModelMessage


def retain_messages(
    messages: "list[ModelMessage]",
) -> "list[ModelMessage]":
    return messages


chat_client = Agent(
    "openai:o4-mini",
    system_prompt="You are a helpful assistant.",
    history_processors=[retain_messages],
)
```

Unfortunately, that is leading to an error for me. Does it work for you? If it doesn't, it seems like we might want to report it.
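Independent of the error, the `history_processors` contract itself is simple: each processor is a callable that maps the message list to the message list the model will actually see. A hedged sketch of the two common shapes, using a plain `dict` as a stand-in for `ModelMessage` so it runs without pydantic-ai installed (`keep_last_n` is a hypothetical helper, not part of the library):

```python
from typing import Callable, List

# Stand-in for pydantic_ai.messages.ModelMessage, for illustration only.
Message = dict


def retain_messages(messages: List[Message]) -> List[Message]:
    # Identity processor: pass the full history through unchanged.
    return messages


def keep_last_n(n: int) -> Callable[[List[Message]], List[Message]]:
    # A trimming processor, the other common use of history_processors.
    def processor(messages: List[Message]) -> List[Message]:
        return messages[-n:]

    return processor


history = [{"role": "user", "content": str(i)} for i in range(5)]
assert retain_messages(history) == history
assert keep_last_n(2)(history) == history[3:]
```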
No, if I copy what you shared into the code, it throws an error: `shiny.types.NotifyException: Error in Chat('chat'): name 'ModelMessage' is not defined`

```python
import os

from dotenv import load_dotenv
from pydantic_ai import Agent
from pydantic_ai.messages import ModelMessage
from shiny.express import ui

_ = load_dotenv()


def retain_messages(
    messages: "list[ModelMessage]",
) -> "list[ModelMessage]":
    return messages


chat_client = Agent(
    "openai:gpt-4.1-nano-2025-04-14",
    system_prompt="You are a helpful assistant.",
    history_processors=[retain_messages],
)

# Set some Shiny page options
ui.page_opts(
    title="OpenAI with Pydantic AI",
    fillable=True,
    fillable_mobile=True,
)

# Create and display a Shiny chat component
chat = ui.Chat(
    id="chat",
    messages=[
        "Hello! I am an assistant powered by Pydantic AI. Ask me anything you'd like to know or do.",
    ],
)
chat.ui()


# Generate a response when the user submits a message
@chat.on_user_submit
async def handle_user_input(user_input: str):
    stream = pydantic_stream_generator(user_input)
    await chat.append_message_stream(stream)


# An async generator function to stream the response from the Pydantic AI agent
async def pydantic_stream_generator(user_input: str):
    async with chat_client.run_stream(user_input) as result:
        async for chunk in result.stream_text(delta=True):
            yield chunk
```
That said, I got a version of the example you suggested working by storing the conversation history as a reactive value:

```python
from typing import List

from dotenv import load_dotenv
from pydantic_ai import Agent
from pydantic_ai.messages import ModelMessage
from shiny import reactive
from shiny.express import ui

_ = load_dotenv()

chat_client = Agent(
    "openai:gpt-4.1-nano-2025-04-14",
    system_prompt="You are a helpful assistant.",
)

conversation_history = reactive.value(list[ModelMessage]([]))

ui.page_opts(
    title="Hello OpenAI Chat",
    fillable=True,
    fillable_mobile=True,
)

chat = ui.Chat(
    id="chat",
    messages=["Hello! How can I help you today?"],
)
chat.ui()


@chat.on_user_submit
async def handle_user_input(user_input: str):
    current_history = conversation_history.get()
    stream = pydantic_stream_generator(user_input, current_history)
    await chat.append_message_stream(stream)


async def pydantic_stream_generator(
    user_input: str, current_history: List[ModelMessage]
):
    message_history = current_history if current_history else None
    async with chat_client.run_stream(
        user_input, message_history=message_history
    ) as result:
        async for chunk in result.stream_text(delta=True):
            yield chunk
        conversation_history.set(result.all_messages())
```
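The retention pattern above (read the holder, pass it as `message_history`, write back `all_messages()` after the turn) can be sketched without Shiny or an API key. `ValueHolder` and `FakeAgent` here are hypothetical stand-ins for `reactive.value` and the pydantic-ai agent, so only the control flow is being demonstrated:

```python
# Hypothetical stand-in for shiny's reactive.value: a get/set holder.
class ValueHolder:
    def __init__(self, value):
        self._value = value

    def get(self):
        return self._value

    def set(self, value):
        self._value = value


# Hypothetical stand-in for the agent: echoes input and returns the full
# message list, like result.all_messages() does after a streamed run.
class FakeAgent:
    def run(self, user_input, message_history=None):
        history = list(message_history or [])
        history.append(("user", user_input))
        history.append(("assistant", f"echo: {user_input}"))
        return history


conversation_history = ValueHolder([])
agent = FakeAgent()


def handle_turn(user_input: str) -> None:
    # Read the current history, run the turn, write back the full transcript.
    current = conversation_history.get()
    all_messages = agent.run(user_input, message_history=current or None)
    conversation_history.set(all_messages)


handle_turn("hello")
handle_turn("again")
# After two turns the holder carries all four messages (2 user, 2 assistant).
assert len(conversation_history.get()) == 4
```

The design point is the same as in the app: the generator only yields text deltas, so the accumulated transcript has to be persisted out-of-band after each turn.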
Co-authored-by: Carson Sievert <[email protected]>
More resolved review threads (now outdated):
- pkg-py/tests/playwright/chat/langchain/structured_output/app.py
- pkg-py/tests/playwright/chat/llama-index/rag_with_chatlas/app.py
- pkg-py/tests/playwright/chat/llama-index/structured_output/app.py (2 threads)
- pkg-py/tests/playwright/chat/llm_package/structured_output/app.py
- pkg-py/tests/playwright/chat/llm_package/tool_calling_ex/app.py (2 threads)
- pkg-py/tests/playwright/chat/pydantic-ai/structured_output/app.py
Standardizes chat UI initialization by moving message setup into chat.ui() calls and enhances assistant messages with suggestions and clearer instructions across multiple Playwright chat test apps. Also updates imports and agent usage for consistency and clarity.
Moved the import of Context to group it with related imports, improving code readability and maintaining a consistent import order.
Thanks, these examples are looking good!
One idea to help with this would be to create a "framework playground app", that has:
And, once we have that, we could have a new template for the playground app under https://shiny.posit.co/py/templates/#generative-ai. I think it'd also make sense for each example app to be a "template" within https://github.com/posit-dev/py-shiny-templates/tree/main/gen-ai. That way, we could leverage the infrastructure in that repo for deployment. Does that seem like something you'd have the bandwidth to take on? It's not something I'd view as super urgent to deliver on, but it would be great to see some slow, steady progress on it.
Sure, I'll work on this this week or next.
This pull request adds examples in shinychat for the following frameworks:
- PydanticAI
- LlamaIndex
- chatlas
- llm
- LangChain